Classification of K-Pop Dance Movements Based on Skeleton Information Obtained by a Kinect Sensor
Authors
Abstract
This paper proposes a method for classifying Korean pop (K-pop) dances based on human skeletal motion data obtained from a Kinect sensor in a motion-capture studio environment. To accomplish this, we construct a K-pop dance database of 800 dance-movement samples covering 200 dance types performed by four professional dancers, built from skeletal joint data captured by a Kinect sensor. Our movement classification consists of three main steps. First, we extract six core angles representing important motion features from the 25 markers in each frame. These angles are concatenated across all frames of each point dance to form a feature vector. Next, dimensionality reduction is performed with a combination of principal component analysis (PCA) and Fisher's linear discriminant analysis, which we call fisherdance. Finally, we design an efficient Rectified Linear Unit (ReLU)-based Extreme Learning Machine Classifier (ELMC) whose input layer is composed of the feature vectors transformed by fisherdance. In contrast to conventional neural networks, the presented classifier achieves rapid processing because it requires no iterative weight learning. Experiments on the constructed K-pop dance database show that the proposed method achieves better classification performance than conventional methods such as k-nearest neighbor (KNN), support vector machine (SVM), and ELM alone.
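The abstract's first two steps — per-frame joint angles from skeletal markers, then PCA followed by Fisher's linear discriminant analysis ("fisherdance") — can be sketched as below. Which six angles are used, the frame count, and the PCA component count are not specified in the abstract, so the values here are illustrative assumptions on synthetic data, not the paper's actual settings:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def joint_angle(a, b, c):
    # Angle (radians) at joint b, formed by segments b->a and b->c,
    # computed from 3-D marker coordinates.
    v1, v2 = a - b, c - b
    cos = v1 @ v2 / (np.linalg.norm(v1) * np.linalg.norm(v2))
    return np.arccos(np.clip(cos, -1.0, 1.0))

# Sanity check: shoulder-elbow-wrist bent at a right angle -> pi/2.
print(joint_angle(np.array([1.0, 0, 0]), np.zeros(3), np.array([0, 1.0, 0])))

# Hypothetical dataset: 4 classes, 6 angles x 30 frames per sequence
# (assumed numbers; the paper uses 200 dance types from 4 dancers).
rng = np.random.default_rng(0)
n_classes, per_class, dim = 4, 50, 6 * 30
X = rng.normal(size=(n_classes * per_class, dim))
y = np.repeat(np.arange(n_classes), per_class)

# "fisherdance"-style reduction: PCA first, then Fisher LDA.
pca = PCA(n_components=20).fit(X)
lda = LinearDiscriminantAnalysis().fit(pca.transform(X), y)
Z = lda.transform(pca.transform(X))  # at most n_classes - 1 = 3 dims
print(Z.shape)
```

Running PCA before LDA is the standard way to avoid singular within-class scatter matrices when the raw feature dimension exceeds the sample count.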
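The third step's key property — classification without iterative weight learning — follows from how an extreme learning machine works: hidden-layer weights are drawn at random and never trained, and only the output weights are solved in closed form. A minimal ReLU-based sketch on hypothetical, well-separated features (the hidden-layer size, regularization constant, and synthetic data are all assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical reduced features: 4 well-separated classes in 10 dims.
n_classes, per_class, dim = 4, 50, 10
means = 5.0 * np.eye(n_classes, dim)
X = np.vstack([rng.normal(loc=m, size=(per_class, dim)) for m in means])
y = np.repeat(np.arange(n_classes), per_class)

def elm_train(X, y, n_hidden=256, reg=1e-3, seed=0):
    # Random input weights/biases are fixed, never trained.
    r = np.random.default_rng(seed)
    W = r.normal(size=(X.shape[1], n_hidden))
    b = r.normal(size=n_hidden)
    H = np.maximum(X @ W + b, 0.0)          # ReLU hidden activations
    T = np.eye(y.max() + 1)[y]              # one-hot targets
    # Output weights via regularized least squares (closed form).
    beta = np.linalg.solve(H.T @ H + reg * np.eye(n_hidden), H.T @ T)
    return W, b, beta

def elm_predict(model, X):
    W, b, beta = model
    H = np.maximum(X @ W + b, 0.0)
    return (H @ beta).argmax(axis=1)

model = elm_train(X, y)
acc = (elm_predict(model, X) == y).mean()
print(acc)
```

Because training reduces to one linear solve, the classifier's fit time is a small fraction of backpropagation-based training, which matches the abstract's claim of rapid processing.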
Similar references
Dance Pose Identification from Motion Capture Data: A Comparison of Classifiers
In this paper, we examine the effectiveness of classification techniques in recognizing dance types based on motion-captured human skeleton data. In particular, the goal is to identify poses which are characteristic of each dance performed, based on information on body joints acquired by a Kinect sensor. The datasets used include sequences from six folk dances and their variations. Multipl...
Audio-Visual Beat Tracking Based on a State-Space Model for a Robot Dancer Performing with a Human Dancer
This paper presents a real-time beat-tracking method that integrates audio and visual information in a probabilistic manner to enable a humanoid robot to dance in synchronization with music and human dancers. Most conventional music robots have focused on either music audio signals or movements of human dancers to detect and predict beat times in real time. Since a robot needs to record music a...
Training Classifiers with Shadow Features for Sensor-Based Human Activity Recognition
In this paper, a novel training/testing process for building and using a classification model for human activity recognition (HAR) is proposed. Traditionally, HAR has been accomplished by a classifier that learns a person's activities by training on skeletal data obtained from a motion sensor, such as the Microsoft Kinect. These skeletal data are the spatial coordinates (x, y, z) of differe...
Skeleton Tracking using Kinect Sensor & Displaying in 3D Virtual Scene
Current research on skeleton tracking techniques focuses on image processing in conjunction with a video camera, constrained by limits on bone and joint movement detection. This paper proposes a 3D skeleton tracking technique using a depth camera, known as a Kinect sensor, with the ability to approximate human poses to be captured, reconstructed, and displayed as a 3D skeleton in the virtual scene using OPENNI...
Tracking Human-like Natural Motion Using Deep Recurrent Neural Networks
The Kinect skeleton tracker is able to achieve considerable human body tracking performance in a convenient and low-cost manner. However, the tracker often captures unnatural human poses, such as discontinuous and vibrating motions, when self-occlusions occur. A majority of approaches tackle this problem by using multiple Kinect sensors in a workspace. Combination of the measurements from different se...
Journal title:
Volume 17, Issue
Pages: -
Publication date: 2017